AIbase

# Low energy consumption

**Bitnet B1.58 2B 4T Gguf** (MIT)
The first open-source, native 1-bit large language model developed by Microsoft Research, with 2 billion parameters trained on a corpus of 4 trillion tokens.
Tags: Large Language Model, English · Publisher: microsoft · Downloads: 25.77k · Likes: 143
**Bitnet B1.58 2B 4T Bf16** (MIT)
An open-source, native 1-bit large language model developed by Microsoft Research, with 2 billion parameters trained on a 4 trillion token corpus, significantly improving computational efficiency.
Tags: Large Language Model, Transformers, English · Publisher: microsoft · Downloads: 2,968 · Likes: 24
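The "1-bit" in these models refers to BitNet b1.58's ternary weights, where each weight takes a value in {-1, 0, +1} (about log2(3) ≈ 1.58 bits), so matrix multiplications reduce to additions and subtractions. As a minimal illustration of the idea, the sketch below implements the "absmean" ternary quantization scheme described in the BitNet b1.58 paper; the helper name `quantize_ternary` is our own, not part of Microsoft's code or any library.

```python
# Illustrative sketch of absmean ternary quantization (BitNet b1.58 style).
# This is NOT Microsoft's implementation; it only shows the rounding scheme.

def quantize_ternary(weights, eps=1e-5):
    """Scale weights by their mean absolute value, then round each
    one to -1, 0, or +1 (clipping values outside that range)."""
    scale = sum(abs(w) for w in weights) / len(weights) + eps
    quantized = [max(-1, min(1, round(w / scale))) for w in weights]
    return quantized, scale

# Example: a small weight vector collapses to ternary values.
w = [0.42, -0.07, 1.30, -0.88, 0.02]
q, s = quantize_ternary(w)
# Every entry of q is in {-1, 0, 1}; dequantization is q[i] * s.
```

Because the quantized weights carry so little information per value, both memory footprint and arithmetic cost drop sharply, which is the efficiency gain the model descriptions above refer to.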
© 2025 AIbase